Estimating the Entropy Rate of Spike Trains

Authors

  • Yun Gao
  • Ioannis Kontoyiannis
  • Elie Bienenstock
Abstract

Information-theoretic methods have been widely used in neuroscience, in the broad effort to analyze and understand the fundamental information-processing tasks performed by the brain. In these studies, entropy has been adopted as the main measure for quantifying the amount of information transmitted between neurons via the spike trains they generate. One of the first and most important goals is to identify appropriate methods that can be used to quantify the amount of information communicated by spike trains, or, in other words, to estimate the entropy of spike trains recorded from live animals. So far, the most commonly used entropy-estimation technique has been the so-called "plug-in" (or maximum-likelihood) estimator and its various modifications. This method essentially consists of calculating the empirical frequencies of all words of a fixed length in the data, and then estimating the "true" entropy of the underlying signal as the entropy of this empirical distribution; see, e.g., [10], [5], [12], [6], [9]. For computational reasons, the plug-in estimator cannot go beyond word lengths of about 10 or 20, and hence it does not take into account potential longer-range time dependencies in the signal. Here we examine the performance of entropy estimators based on two data compression algorithms: the Lempel-Ziv algorithm (LZ) and the Context-Tree Weighting method (CTW). Specifically, we consider two LZ-based entropy estimators and one based on the CTW. The first LZ-based method has been widely and very successfully used in many applications, and the other is a new estimator with some novel and more desirable statistical properties. The CTW-based estimator is based on the work of Willems et al. [13], [14], [15] and has also been considered in [1], [3].
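
The plug-in estimator is simple enough to illustrate directly. The sketch below (in Python; the function name and the toy spike train are ours, not from the paper) counts the empirical frequencies of all binary words of a fixed length and reports the entropy of that empirical distribution, normalized per time bin:

```python
# Minimal sketch of the plug-in (maximum-likelihood) entropy estimator:
# count all length-`word_len` words in a binarized spike train and return
# the entropy of their empirical distribution, in bits per time bin.
from collections import Counter
from math import log2

def plugin_entropy_rate(spikes, word_len):
    words = [tuple(spikes[i:i + word_len])
             for i in range(len(spikes) - word_len + 1)]
    counts = Counter(words)
    n = len(words)
    block_entropy = -sum((c / n) * log2(c / n) for c in counts.values())
    return block_entropy / word_len  # per-bin (per-symbol) estimate

# Toy binary spike train: 1 = spike in the bin, 0 = no spike.
train = [0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0]
print(plugin_entropy_rate(train, word_len=3))
```

The computational limitation mentioned above is visible here: the number of possible words grows as 2^word_len, so the empirical distribution becomes hopelessly undersampled well before word lengths of 20.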

Similar Articles

Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity

Normalized Lempel-Ziv complexity, which measures the generation rate of new patterns along a digital sequence, is closely related to such important source properties as entropy and compression ratio, but, in contrast to these, it is a property of individual sequences. In this article, we propose to exploit this concept to estimate (or, at least, to bound from below) the entropy of neural discha...
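
A generic sketch of this idea, assuming the classic LZ76 parsing in the Kaspar-Schuster formulation rather than the article's exact estimator (all names are illustrative):

```python
# Sketch: LZ76 complexity (Kaspar-Schuster formulation) and the normalized
# complexity c(n) * log2(n) / n, which estimates (and asymptotically lower
# bounds) the entropy rate of a binary sequence. This is a large-n
# asymptotic and is biased for short sequences.
from math import log2

def lz76_complexity(s):
    """Number of phrases in the LZ76 parsing of the sequence s."""
    n = len(s)
    c, l, i, k, k_max = 1, 1, 0, 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:                # no longer match found: new phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def lz_entropy_estimate(s):
    """Normalized LZ76 complexity of a binary sequence, in bits/symbol."""
    n = len(s)
    return lz76_complexity(s) * log2(n) / n

print(lz_entropy_estimate("01101000100111010010"))
```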

Estimating Temporal Causal Interaction between Spike Trains with Permutation and Transfer Entropy

Estimating the causal interaction between neurons is very important for better understanding the functional connectivity in neuronal networks. We propose a method called normalized permutation transfer entropy (NPTE) to evaluate the temporal causal interaction between spike trains, which quantifies the fraction of ordinal information in one neuron that is present in another. The performanc...
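
The NPTE normalization itself is not given in this excerpt; the sketch below shows only the generic ingredients, assuming ordinal (permutation) symbolization followed by a plug-in transfer entropy on the symbol sequences. All function names are ours:

```python
# Sketch: transfer entropy TE(X -> Y) computed on ordinal-pattern symbols.
# Generic building blocks only; not the authors' normalized NPTE.
from collections import Counter
from itertools import permutations
from math import log2

def ordinal_symbols(x, m):
    """Map each length-m window of x to the index of its ordinal pattern.
    Ties are broken by position within the window (stable sort)."""
    index = {p: i for i, p in enumerate(permutations(range(m)))}
    return [index[tuple(sorted(range(m), key=lambda k: x[t + k]))]
            for t in range(len(x) - m + 1)]

def transfer_entropy(sx, sy):
    """Plug-in TE(X -> Y) in bits from two aligned symbol sequences."""
    triple, pair_xy, pair_yy, single = Counter(), Counter(), Counter(), Counter()
    n = len(sy) - 1
    for t in range(n):
        triple[(sy[t + 1], sy[t], sx[t])] += 1   # counts for p(y', y, x)
        pair_xy[(sy[t], sx[t])] += 1             # counts for p(y, x)
        pair_yy[(sy[t + 1], sy[t])] += 1         # counts for p(y', y)
        single[sy[t]] += 1                       # counts for p(y)
    te = 0.0
    for (yn, y, x), c in triple.items():
        te += (c / n) * log2((c / pair_xy[(y, x)]) /
                             (pair_yy[(yn, y)] / single[y]))
    return te

# Hypothetical usage on two binned spike-count series x, y of equal length:
# te = transfer_entropy(ordinal_symbols(x, 3), ordinal_symbols(y, 3))
```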

Spike train entropy-rate estimation using hierarchical Dirichlet process priors

Entropy rate quantifies the amount of disorder in a stochastic process. For spiking neurons, the entropy rate places an upper bound on the rate at which the spike train can convey stimulus information, and a large literature has focused on the problem of estimating entropy rate from spike train data. Here we present Bayes least squares and empirical Bayesian entropy rate estimators for binary s...
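
The hierarchical Dirichlet process construction is beyond a short sketch, but a far simpler Bayesian baseline conveys the flavor: place a symmetric Dirichlet prior over all binary words of a fixed length and report the entropy of the posterior-mean word distribution. This stand-in is ours, not the estimator proposed in the paper:

```python
# Sketch: Dirichlet-smoothed ("add-beta") block entropy, a simple Bayesian
# baseline -- NOT the hierarchical Dirichlet process estimator of the paper.
from collections import Counter
from math import log2

def dirichlet_block_entropy(spikes, word_len, beta=0.5):
    """Entropy (bits/bin) of the posterior-mean distribution over binary
    words under a symmetric Dirichlet(beta) prior. Keep word_len modest:
    the alphabet has 2**word_len words."""
    counts = Counter(tuple(spikes[i:i + word_len])
                     for i in range(len(spikes) - word_len + 1))
    n = sum(counts.values())
    k = 2 ** word_len
    denom = n + beta * k
    probs = [(c + beta) / denom for c in counts.values()]
    probs += [beta / denom] * (k - len(counts))  # words never observed
    return -sum(p * log2(p) for p in probs) / word_len
```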

Differential Entropy of Multivariate Neural Spike Trains

Most approaches to analysing the spatiotemporal dynamics of neural populations involve binning spike trains. In practice, this is likely to underestimate the information carried by spike-timing codes if they involve high-precision patterns of inter-spike intervals (ISIs). In this paper we set out to investigate the differential entropy of multivariate neural spike trains, following the work of...
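
For a single spike train, the differential entropy of the ISI distribution can be sketched with the classical Kozachenko-Leonenko nearest-neighbour estimator; the multivariate case studied in the paper is more involved. A minimal one-dimensional version, assuming distinct ISI values (exact ties must be jittered to avoid log 0):

```python
# Sketch: 1-D Kozachenko-Leonenko nearest-neighbour estimate of
# differential entropy, in bits. Assumes all sample values are distinct.
from math import log

EULER_GAMMA = 0.5772156649015329

def psi_int(n):
    """Digamma at a positive integer: psi(n) = -gamma + H_{n-1}."""
    return -EULER_GAMMA + sum(1.0 / k for k in range(1, n))

def kl_differential_entropy(samples):
    xs = sorted(samples)
    n = len(xs)
    # Distance from each point to its nearest neighbour (1-D, sorted).
    nn = [min(abs(xs[i] - xs[j]) for j in (i - 1, i + 1) if 0 <= j < n)
          for i in range(n)]
    h_nats = psi_int(n) - psi_int(1) + log(2.0) + sum(log(d) for d in nn) / n
    return h_nats / log(2.0)  # nats -> bits

# Hypothetical usage: ISIs from a list of spike times (in seconds).
spike_times = [0.01, 0.05, 0.06, 0.13, 0.21, 0.26, 0.38]
isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
print(kl_differential_entropy(isis))
```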

Estimating Information Rates with Confidence Intervals in Neural Spike Trains

Information theory provides a natural set of statistics to quantify the amount of knowledge a neuron conveys about a stimulus. A related work (Kennel, Shlens, Abarbanel, & Chichilnisky, 2005) demonstrated how to reliably estimate, with a Bayesian confidence interval, the entropy rate from a discrete, observed time series. We extend this method to measure the rate of novel information that a neu...
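
The Bayesian interval of the cited work is specific to that method; as a generic illustration of attaching an interval to an entropy-rate estimate, here is a percentile bootstrap around the plug-in estimator. Resampling words independently ignores serial dependence, so this is a rough sketch only, not the Kennel et al. procedure:

```python
# Sketch: percentile-bootstrap confidence interval for the plug-in entropy
# rate (bits/bin). I.i.d. word resampling ignores serial dependence.
import random
from collections import Counter
from math import log2

def plugin_h(words, word_len):
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * log2(c / n) for c in counts.values()) / word_len

def bootstrap_entropy_ci(spikes, word_len, n_boot=1000, alpha=0.05):
    words = [tuple(spikes[i:i + word_len])
             for i in range(len(spikes) - word_len + 1)]
    reps = sorted(plugin_h(random.choices(words, k=len(words)), word_len)
                  for _ in range(n_boot))
    lo = reps[int(n_boot * alpha / 2)]
    hi = reps[int(n_boot * (1 - alpha / 2))]
    return plugin_h(words, word_len), (lo, hi)
```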


Journal:

Volume   Issue

Pages  -

Publication date: 2004